Deep Nearest Class Mean Classifiers
Authors
Abstract
In this paper we introduce DeepNCM, a Nearest Class Mean classification method enhanced to directly learn highly non-linear deep (visual) representations of the data. To overcome the computationally expensive process of recomputing the class means after every update of the representation, we opt for approximating the class means with an online estimate. Moreover, to allow the class means to closely follow the drifting representation, we introduce per-epoch mean condensation. Using online class means with condensation, DeepNCM can train efficiently on large datasets. Our (preliminary) experimental results indicate that DeepNCM performs on par with SoftMax-optimised networks.
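To make the mechanics concrete, the following is a minimal sketch of the idea in PyTorch, not the authors' code: class means are estimated online from mini-batch features rather than recomputed over the whole dataset, and a per-epoch condensation step shrinks the accumulated counts so the means can track the drifting representation. The class name, the decay factor, and its value are illustrative assumptions.

    import torch

    class OnlineNCM:
        """Nearest Class Mean classifier with online mean estimates (a sketch)."""

        def __init__(self, num_classes, feat_dim, decay=0.1):
            self.means = torch.zeros(num_classes, feat_dim)
            self.counts = torch.zeros(num_classes)
            self.decay = decay  # assumed condensation factor

        @torch.no_grad()
        def update(self, feats, labels):
            # Online (running) estimate of each class mean from batch features.
            for c in labels.unique():
                mask = labels == c
                n_new, n_old = mask.sum(), self.counts[c]
                batch_mean = feats[mask].mean(dim=0)
                self.means[c] = (n_old * self.means[c] + n_new * batch_mean) / (n_old + n_new)
                self.counts[c] = n_old + n_new

        @torch.no_grad()
        def condense(self):
            # Per-epoch condensation: down-weight the accumulated counts so
            # recent features dominate as the representation drifts.
            self.counts *= self.decay

        def logits(self, feats):
            # Negative squared Euclidean distance to each class mean.
            return -torch.cdist(feats, self.means).pow(2)

A training loop would use logits() with a standard cross-entropy loss, call update() on detached features after every batch, and call condense() once at the end of each epoch.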
Similar papers
From classifiers to discriminators: A nearest neighbor rule induced discriminant analysis
The design of current discriminant analysis methods is generally independent of classifiers, so the connection between discriminant analysis methods and classifiers is loose. This paper provides a way to design discriminant analysis methods that are bound to classifiers. We begin with a local mean based nearest neighbor (LM-NN) classifier and use its decision rule to supervise the design of a d...
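As a rough illustration of the LM-NN decision rule mentioned above (a sketch from the description, not the paper's code; the neighborhood size k is an arbitrary choice):

    import numpy as np

    def lm_nn_predict(X_train, y_train, x, k=3):
        # For each class, average the k training samples nearest to the
        # query, then assign the class whose local mean is closest.
        best_class, best_dist = None, np.inf
        for c in np.unique(y_train):
            Xc = X_train[y_train == c]
            nearest = Xc[np.argsort(np.linalg.norm(Xc - x, axis=1))[:k]]
            dist = np.linalg.norm(x - nearest.mean(axis=0))
            if dist < best_dist:
                best_class, best_dist = c, dist
        return best_class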
Deep Nearest Class Mean Model for Incremental Odor Classification
In recent years, more and more machine learning algorithms have been applied to odor recognition. These algorithms usually assume that the training dataset is static. However, for some odor recognition tasks the dataset grows dynamically: not only the training samples but also the number of classes increases over time. Motivated by this concern, we proposed a dee...
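The appeal of a nearest class mean model in this incremental setting can be sketched as follows (an illustration of the general NCM idea, not the paper's model): adding a class only means storing one more prototype, with no retraining of the existing ones.

    import numpy as np

    class IncrementalNCM:
        def __init__(self):
            self.means = {}  # class label -> prototype vector

        def add_class(self, label, feats):
            # A newly arriving class is summarized by its feature mean.
            self.means[label] = np.asarray(feats).mean(axis=0)

        def predict(self, x):
            # Assign the class with the nearest stored prototype.
            return min(self.means, key=lambda c: np.linalg.norm(x - self.means[c]))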
Symbolic Nearest Mean Classifiers
The minimum-distance classifier summarizes each class with a prototype and then uses a nearest neighbor approach for classification. Three drawbacks of the original minimum-distance classifier are its inability to work with symbolic attributes, to weight attributes, and to learn more than a single prototype per class. The proposed solutions to these problems include defining the mean for symbolic...
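One of the proposed fixes, a mean for symbolic attributes, can be sketched like this (the mode-based mean and mismatch-count distance are assumptions consistent with the description, not the paper's exact definitions):

    from collections import Counter

    def symbolic_mean(rows):
        # 'Mean' of symbolic data: the most frequent value per attribute.
        return tuple(Counter(col).most_common(1)[0][0] for col in zip(*rows))

    def mismatches(a, b):
        # Distance between symbolic tuples: number of differing attributes.
        return sum(x != y for x, y in zip(a, b))

    data = {"spam": [("free", "offer"), ("free", "cash")],
            "ham": [("meeting", "notes"), ("meeting", "cash")]}
    prototypes = {c: symbolic_mean(rows) for c, rows in data.items()}
    query = ("free", "cash")
    print(min(prototypes, key=lambda c: mismatches(prototypes[c], query)))  # spam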
Lazy Classifiers Using P-trees
Lazy classifiers store all of the training samples and do not build a classifier until a new sample needs to be classified. This differs from eager classifiers, such as decision tree induction, which build a general model (such as a decision tree) before receiving new samples. K-nearest neighbor (KNN) classification is a typical lazy classifier. Given a set of training data, a k-nearest neighbor c...
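The lazy/eager split is easy to see in code. Below is a plain k-nearest neighbor classifier as a sketch (the paper's P-tree structure is not reproduced here): fit() merely stores the data, and all computation is deferred to prediction time.

    import numpy as np
    from collections import Counter

    class LazyKNN:
        def __init__(self, k=5):
            self.k = k

        def fit(self, X, y):
            # Lazy: no model is built, the training set is just stored.
            self.X, self.y = np.asarray(X), np.asarray(y)
            return self

        def predict_one(self, x):
            # All work happens here, when a new sample arrives.
            d = np.linalg.norm(self.X - x, axis=1)
            votes = self.y[np.argsort(d)[:self.k]]
            return Counter(votes).most_common(1)[0][0]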
Feature scaling in support vector data description
When only samples of one class are easily accessible in a classification problem, it is called a one-class classification problem. Many standard classifiers, like backpropagation neural networks, fail on such data. Some other classifiers, like k-means clustering or the nearest neighbor classifier, can be applied after some minor changes. In this paper we focus on the support vector data de...
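To make the one-class setting concrete, here is a small scikit-learn sketch using OneClassSVM (closely related to support vector data description for RBF kernels) trained on target-class samples only; the data, the nu value, and the use of standardization are illustrative assumptions:

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(0)
    # Only target-class samples are available; one feature has a much
    # larger scale, which is where per-feature scaling matters.
    X_target = rng.normal(loc=0.0, scale=[1.0, 100.0], size=(200, 2))

    model = make_pipeline(StandardScaler(), OneClassSVM(kernel="rbf", nu=0.05))
    model.fit(X_target)
    print(model.predict([[0.0, 0.0]]))  # +1 for target, -1 for outlier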